
    The fate of non-trivial entanglement under gravitational collapse

    We analyse the evolution of the entanglement of a non-trivial initial quantum field state (which, for simplicity, has been taken to be a bipartite state made out of the vacuum and the first excited state) when it undergoes gravitational collapse. We carry out this analysis by generalising the tools developed to study entanglement behaviour in stationary scenarios, making them suitable for dynamical spacetimes. We also discuss what kinds of problems can be tackled with the formalism spelled out here, and single out future avenues of research. Comment: 9 pages, 2 figures. v2: Added journal reference and small changes to match published version.

    Finite Dimension: A Mathematical Tool to Analyse Glycans

    There is a need to develop widely applicable tools to understand glycan organization, diversity, and structure. We present a graph-theoretical study of a large sample of glycans in terms of finite dimension, a new metric which adapts the classical Hausdorff "fractal" dimension to finite sets. Every glycan in the sample is encoded, via finite dimension, as a point of Glycan Space, a new notion introduced in this paper. Two major outcomes were found: (a) the existence of universal bounds that restrict the universe of possible glycans and show, for instance, that the graphs of glycans are a very special type of chemical graph, and (b) how Glycan Space is related to biological domains associated with the analysed glycans. In addition, we briefly discuss how this encoding may help to improve search in glycan databases.
    Authors: Juan Manuel Alonso (Universidad Nacional de Cuyo; Instituto de Matemática Aplicada de San Luis "Prof. Ezio Marchi", CONICET - Universidad Nacional de San Luis); Agustina Arroyuelo, Pablo Germán Garay, Osvaldo Antonio Martín, and Jorge Alberto Vila (Instituto de Matemática Aplicada de San Luis "Prof. Ezio Marchi", CONICET - Universidad Nacional de San Luis, Argentina).
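
The paper's exact definition of finite dimension is not reproduced in the abstract. As a rough, hypothetical illustration of the general idea (a Hausdorff-style dimension adapted to finite graphs), the sketch below computes a greedy box-counting estimate on a small path graph standing in for a linear glycan chain; the greedy covering, the choice of radii, and the least-squares fit are all our assumptions, not the paper's method.

```python
from math import log

def ball(adj, center, r):
    """Nodes within graph distance r of center (breadth-first search)."""
    frontier, seen = {center}, {center}
    for _ in range(r):
        frontier = {w for v in frontier for w in adj[v]} - seen
        seen |= frontier
    return seen

def covering_number(adj, r):
    """Greedy count of radius-r balls needed to cover every node."""
    uncovered, count = set(adj), 0
    while uncovered:
        c = min(uncovered)             # deterministic choice of next center
        uncovered -= ball(adj, c, r)
        count += 1
    return count

def dimension_estimate(adj, radii=(1, 2, 4)):
    """Box-counting style: negated least-squares slope of log N(r) vs log r."""
    xs = [log(r) for r in radii]
    ys = [log(covering_number(adj, r)) for r in radii]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# A path graph on 27 nodes as a toy stand-in for a linear glycan chain.
path = {i: [j for j in (i - 1, i + 1) if 0 <= j < 27] for i in range(27)}
print(covering_number(path, 1), covering_number(path, 2), covering_number(path, 4))
print(round(dimension_estimate(path), 2))
```

The greedy covering overestimates N(r) on a finite path, so the estimate comes out below the asymptotic value of 1; the point is only that chain-like graphs score near 1 while highly branched ones score higher.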

    A paradigm shift for socioeconomic justice and health: from focusing on inequalities to aiming at sustainable equity

    OBJECTIVES: To measure the 'best possible health for all', incorporating sustainability, and to establish the magnitude of global health inequity. STUDY DESIGN: Observational, retrospective. METHODS: We identified countries meeting three criteria: (1) a healthy population - life expectancy above the world average; (2) living conditions feasible to replicate worldwide - per-capita gross domestic product (GDP-pc) below the world average; and (3) sustainability - per-capita carbon dioxide emissions below the planetary pollution boundary. Using these healthy, feasible, and sustainable (HFS) countries as the gold standard, we estimated the burden of global health inequity (BGHiE) in terms of excess deaths, analyzing time trends (1950-2012) by age, sex, and geographic location. Finally, we defined a global income 'equity zone' and quantified the economic gap needed to achieve global sustainable health equity. RESULTS: A total of 14 countries worldwide met the HFS criteria. Since 1970, there has been a BGHiE of approximately 17 million avoidable deaths per year (approximately 40% of all deaths), with 36 life-years lost per excess death. Young children and women bore a higher BGHiE, and, in recent years, the highest proportion of avoidable deaths occurred in Africa, India, and the Russian Federation. By 2012, the most efficient HFS countries had a GDP-pc of USD 2,165 per year, which we proposed as the lower equity-zone threshold. The estimated USD 2.58 trillion economic gap represents 3.6% of the world's GDP - twenty times larger than current total global foreign aid. CONCLUSIONS: Sustainable health equity metrics provide a benchmark tool to guide efforts toward transforming overall living conditions, as a means to achieve the 'best possible health for all.'
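
The headline figures in the results can be cross-checked with back-of-envelope arithmetic using only the numbers quoted in the abstract; every derived quantity below follows from those quoted inputs:

```python
# Consistency check of the figures quoted in the abstract.
gap = 2.58e12            # estimated economic gap, USD
gap_share = 0.036        # stated share of world GDP (3.6%)
aid_multiple = 20        # the gap is "twenty times" total global foreign aid

implied_world_gdp = gap / gap_share        # what world GDP must be
implied_foreign_aid = gap / aid_multiple   # what global foreign aid must be

# Burden of global health inequity: excess deaths times life-years lost each.
excess_deaths_per_year = 17e6
life_years_per_death = 36
life_years_lost = excess_deaths_per_year * life_years_per_death

print(f"implied world GDP:   {implied_world_gdp / 1e12:.1f} trillion USD")
print(f"implied foreign aid: {implied_foreign_aid / 1e9:.0f} billion USD")
print(f"life-years lost:     {life_years_lost / 1e6:.0f} million per year")
```

The implied totals (roughly USD 71.7 trillion world GDP and USD 129 billion in aid) are of the right order for 2012, which supports the internal consistency of the quoted figures.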

    Permissionless Clock Synchronization with Public Setup

    The permissionless clock synchronization problem asks how a population of parties can maintain a system-wide synchronized clock while their participation rate fluctuates, possibly very widely, over time. The underlying assumption is that parties experience the passage of time at roughly the same speed; however, they may disengage from and re-engage with the protocol following arbitrary (even adversarially chosen) participation patterns. This (classical) problem has received renewed attention due to the advent of blockchain protocols, and it has recently been solved in the proof-of-stake setting, i.e., when parties are assumed to have access to a trusted PKI setup [Badertscher et al., Eurocrypt '21]. In this work, we present the first proof-of-work (PoW)-based permissionless clock synchronization protocol. Our construction assumes a public setup (e.g., a CRS) and relies on an honest majority of computational power that, for the first time, is described in a fine-grained timing model that does not rely on a global clock exporting the current time to all parties. As a secondary result of independent interest, our protocol gives rise to the first PoW-based ledger consensus protocol that does not rely on an external clock for the time-stamping of transactions and the adjustment of the PoW difficulty.
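
The sketch below is not the paper's PoW construction; it is a generic, minimal illustration of why an honest majority makes clock estimation possible at all: a (re)joining party that takes the median of reported clock values gets an estimate sandwiched between the smallest and largest honest report, no matter what a corrupt minority claims. All names and values here are hypothetical.

```python
import statistics

def estimate_time(reports):
    """Estimate the current time as the median of reported clock values.
    If strictly more than half the reports are honest, the median lies
    between the smallest and largest honest report, regardless of what
    the remaining (corrupt) parties claim."""
    return statistics.median_low(reports)

honest = [1000, 1001, 1002, 1001, 1000]   # honest clocks, closely synchronized
adversarial = [0, 10**9, 55, 10**9]       # a corrupt minority, arbitrary values

estimate = estimate_time(honest + adversarial)
assert min(honest) <= estimate <= max(honest)
print("estimate:", estimate)
```

The paper's actual challenge is harder: without a PKI, "one party, one report" does not hold, which is why the construction must weigh reports by computational power via proofs of work.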

    Sound and Fine-grain Specification of Ideal Functionalities

    Nowadays it is widely accepted to formulate the security of a protocol carrying out a given task via the "trusted-party paradigm," in which the protocol execution is compared with an ideal process where the outputs are computed by a trusted party that sees all the inputs. A protocol is said to securely carry out a given task if running the protocol with a realistic adversary amounts to "emulating" the ideal process with the appropriate trusted party. In the Universal Composability (UC) framework, the program run by the trusted party is called an ideal functionality. While this simulation-based security formulation provides strong security guarantees, its usefulness is contingent on the properties and correct specification of the ideal functionality, which, as demonstrated in recent years by the coexistence of complex, multiple functionalities for the same task as well as by their "unstable" nature, does not seem to be an easy task. In this paper we address this problem by introducing a general methodology for the sound specification of ideal functionalities. First, we introduce the class of canonical ideal functionalities for a cryptographic task, which unifies the syntactic specification of a large class of cryptographic tasks under the same basic template functionality. Furthermore, this representation enables the isolation of the individual properties of a cryptographic task as separate members of the corresponding class. By endowing the class of canonical functionalities with an algebraic structure, we are able to combine basic functionalities into a single final canonical functionality for a given task. Effectively, this puts forth a bottom-up approach for the specification of ideal functionalities: first one defines a set of basic constituent functionalities for the task at hand, and then combines them into a single ideal functionality, taking advantage of the algebraic structure.
In our framework, the constituent functionalities of a task can be derived either directly or, following a translation strategy we introduce, from existing game-based definitions; such definitions have in many cases captured desired individual properties of cryptographic tasks, albeit in less adversarial settings than universal composition. Our translation methodology entails a sequence of steps that derive a corresponding canonical functionality from a given game-based definition. In this way, we obtain a well-defined mapping of game-based security properties to their corresponding UC counterparts. Finally, we demonstrate the power of our approach by applying our methodology to a variety of basic cryptographic tasks, including commitments, digital signatures, zero-knowledge proofs, and oblivious transfer. While in some cases our derived canonical functionalities are equivalent to existing formulations, thus attesting to the validity of our approach, in others they differ, enabling us to "debug" previous definitions and pinpoint their shortcomings.
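
A drastically simplified, hypothetical rendering of the bottom-up idea: model each constituent property as a predicate over requests, and let the algebraic combination be their conjunction. The property names, state layout, and `combine` operator below are our illustration, not the paper's formalism.

```python
# Hypothetical sketch: constituent "functionalities" as predicates on
# (request, state); combining them takes their conjunction, mirroring the
# bottom-up composition of canonical functionalities described above.

def hiding(request, state):
    # A commitment must not reveal the committed value before opening.
    return not (request == "read" and not state["opened"])

def binding(request, state):
    # Once committed, the value cannot be changed.
    return not (request == "rewrite" and state["committed"])

def combine(*props):
    """Algebraic 'meet' of constituent functionalities: allow a request
    only if every constituent property allows it."""
    return lambda request, state: all(p(request, state) for p in props)

commitment = combine(hiding, binding)

state = {"committed": True, "opened": False}
print(commitment("read", state))     # disallowed: would break hiding
print(commitment("rewrite", state))  # disallowed: would break binding
state["opened"] = True
print(commitment("read", state))     # allowed: opening reveals the value
```

The real framework works with interactive functionalities in the UC model rather than predicates, but the shape is the same: isolate properties individually, then take their combination as the final specification.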

    Unveiling quantum entanglement degradation near a Schwarzschild black hole

    We analyze the entanglement degradation produced by the Hawking effect in a bipartite Alice-Rob system when Rob is in the proximity of a Schwarzschild black hole while Alice is freely falling into it. We obtain the limit in which the tools imported from the study of Unruh entanglement degradation can be properly used, keeping control of the approximation. As a result, we are able to determine the degree of entanglement as a function of Rob's distance to the event horizon, the mass of the black hole, and the frequency of Rob's entangled modes. By means of this analysis we show that all the interesting phenomena occur in the vicinity of the event horizon, and that the presence of an event horizon does not effectively degrade the entanglement when Rob is far from the black hole. The universality of the phenomenon is presented: there are no fundamental differences between different masses when working in the natural unit system adapted to each black hole. We also discuss some aspects of the localization of Alice's and Rob's states. All of this study is done without using the single-mode approximation. Comment: 16 pages, 10 figures, revtex4. Added journal reference.
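
As a toy illustration of entanglement degradation in general (not the paper's Rindler-mode calculation), one can track the negativity of a Bell pair when one qubit passes through an amplitude-damping channel of strength p, a standard textbook stand-in for one party's mode becoming noisier. The channel choice is our assumption; for this particular channel the partial transpose has a single potentially negative eigenvalue, computable in closed form.

```python
from math import sqrt

def negativity(p):
    """Negativity of a Bell pair (|00> + |11>)/sqrt(2) after one qubit
    passes through an amplitude-damping channel of strength p
    (p = 0: no noise, p = 1: complete decay).

    The partially transposed state is block diagonal with eigenvalues
    1/2 and (1-p)/2, plus those of the 2x2 block
        (1/2) * [[0,         sqrt(1-p)],
                 [sqrt(1-p), p        ]].
    Only the smaller eigenvalue of that block can be negative."""
    q = sqrt(1.0 - p)
    lam_minus = (p - sqrt(p * p + 4 * q * q)) / 4.0
    return max(0.0, -lam_minus)

for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"p = {p:.2f}  negativity = {negativity(p):.3f}")
```

The negativity falls monotonically from 1/2 (maximal entanglement) to 0, mirroring qualitatively how entanglement degrades as the noisy influence on one party grows; in the paper, the analogous parameter is Rob's proximity to the horizon.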

    Round-Preserving Parallel Composition of Probabilistic-Termination Protocols

    An important benchmark for multi-party computation (MPC) protocols is their round complexity. For several important MPC tasks, (tight) lower bounds on the round complexity are known. However, for some of these tasks, such as broadcast, the lower bounds can be circumvented when the termination round of every party is not a priori known and simultaneous termination is not guaranteed. Protocols with this property are called probabilistic-termination (PT) protocols. Running PT protocols in parallel affects the round complexity of the resulting protocol in somewhat unexpected ways. For instance, an execution of m protocols with constant expected round complexity might take O(log m) rounds to complete. In a seminal work, Ben-Or and El-Yaniv (Distributed Computing '03) developed a technique for the parallel execution of arbitrarily many broadcast protocols while preserving expected round complexity. More recently, Cohen et al. (CRYPTO '16) devised a framework for the universal composition of PT protocols and provided the first composable parallel-broadcast protocol with a simulation-based proof. These constructions crucially rely on the fact that broadcast is "privacy free," and do not generalize to arbitrary protocols in a straightforward way. This raises the question of whether it is possible to execute arbitrary PT protocols in parallel without increasing the round complexity. In this paper we tackle this question and provide both feasibility and infeasibility results. We construct a round-preserving protocol compiler, secure against a dishonest minority of actively corrupted parties, that compiles arbitrary protocols into a protocol realizing their parallel composition, while having only black-box access to the underlying protocols.
    Furthermore, we prove that the same cannot be achieved using known techniques given only black-box access to the functionalities realized by the protocols, unless merely security against semi-honest corruptions is required, in which case we provide a protocol.
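
The O(log m) phenomenon quoted above is easy to see numerically: model each PT protocol's termination round as a geometric random variable (constant expected rounds), and note that a naive parallel run finishes only when the slowest protocol does, i.e., at the maximum of m such variables, which grows like log m. The termination probability of 1/2 per round is an arbitrary modelling choice for illustration.

```python
import random

def pt_protocol_rounds(rng, p=0.5):
    """Rounds until a probabilistic-termination protocol finishes,
    modelled as a geometric random variable with expectation 1/p."""
    rounds = 1
    while rng.random() >= p:
        rounds += 1
    return rounds

def parallel_rounds(rng, m):
    """A naive parallel execution of m PT protocols ends only when the
    slowest one does: the maximum of m geometric variables."""
    return max(pt_protocol_rounds(rng) for _ in range(m))

rng = random.Random(1)
trials = 2000
for m in (1, 16, 256):
    avg = sum(parallel_rounds(rng, m) for _ in range(trials)) / trials
    print(f"m = {m:4d}  average rounds ~ {avg:.2f}")
```

Each individual protocol averages 2 rounds, yet the parallel execution averages roughly log2(m) + 1.3 rounds, which is exactly the blow-up a round-preserving compiler must avoid.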

    Minimal Complete Primitives for Secure Multi-Party Computation

    The study of the minimal cryptographic primitives needed to implement secure computation among two or more players is a fundamental question in cryptography. The issue of complete primitives for the case of two players has been thoroughly studied. However, in the multi-party setting, where there are n > 2 players of whom t are corrupted, the question of what the simplest complete primitives are remained open for t ≥ n/3. (A primitive is called complete if any computation can be carried out by players having access only to the primitive and local computation.) In this paper we consider this question and introduce complete primitives of minimal cardinality for secure multi-party computation. The cardinality issue (the number of players accessing the primitive) is essential in settings where primitives are implemented by some other means, and the simpler the primitive the easier it is to realize. We show that our primitives are complete and of the minimal cardinality possible in most cases.